
Conversation

likholat (Contributor):

Found issues: 173476, 173478

@github-actions github-actions bot added the category: WWB PR changes WWB label Sep 11, 2025
@Wovchena Wovchena requested a review from Copilot September 11, 2025 15:05
@Copilot Copilot AI left a comment:

Pull Request Overview

This PR adds LoRA (Low-Rank Adaptation) support to the Who What Benchmark text generation pipeline, addressing issues CVS-173476 and CVS-173478. The changes enable loading and using LoRA adapters with both the OpenVINO GenAI and HuggingFace Transformers backends; a usage sketch follows the list below.

  • Added LoRA adapter configuration support for OpenVINO GenAI and HuggingFace pipelines
  • Enhanced text generation functions to handle empty adapter configurations
  • Updated help text to clarify LoRA adapter support for Text Generation and Image Generation pipelines
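
To make the feature concrete, here is a minimal usage sketch (an illustration under assumed names, not the PR's code: the adapter paths, alphas, and model directory are hypothetical):

import openvino_genai

# Hypothetical adapter files and their blending weights (alphas).
adapter_paths = ["adapter_a.safetensors", "adapter_b.safetensors"]
alphas = [0.75, 0.25]

adapter_config = openvino_genai.AdapterConfig()
for path, alpha in zip(adapter_paths, alphas):
    # Register each LoRA adapter together with its scaling factor.
    adapter_config.add(openvino_genai.Adapter(path), alpha)

pipeline = openvino_genai.LLMPipeline("model_dir", device="CPU", adapters=adapter_config)
print(pipeline.generate("Hello", max_new_tokens=16))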

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.

  • tools/who_what_benchmark/whowhatbench/wwb.py: updated help text and enhanced the genai_gen_text function with an empty_adapters parameter
  • tools/who_what_benchmark/whowhatbench/text_evaluator.py: added empty_adapters parameter support to the TextEvaluator class
  • tools/who_what_benchmark/whowhatbench/model_loaders.py: implemented LoRA adapter loading for both the OpenVINO GenAI and HuggingFace pipelines


exit(-1)

adapter_config = openvino_genai.AdapterConfig()
if "adapters" in kwargs and kwargs["adapters"] is not None:
Copilot AI (Sep 11, 2025):

Potential KeyError if 'alphas' is not present in kwargs. The code assumes kwargs['alphas'] exists when adapters are provided, but there's no validation to ensure this key exists.

Suggested change:
- if "adapters" in kwargs and kwargs["adapters"] is not None:
+ if "adapters" in kwargs and kwargs["adapters"] is not None:
+     if "alphas" not in kwargs or kwargs["alphas"] is None:
+         raise ValueError("If 'adapters' are provided, 'alphas' must also be provided in kwargs.")


    assert len(alphas) == len(adapter_names), "`alphas` must be the same length as `adapters`"
    model.add_weighted_adapter([f"adapter_{idx}" for idx in range(len(kwargs['adapters']))], alphas, "merged_lora")
else:
    model.add_weighted_adapter([f"adapter_{idx}" for idx in range(len(kwargs['adapters']))], "merged_lora")
Copilot AI (Sep 11, 2025):

Missing weights argument in the add_weighted_adapter call. The method expects weights as its second parameter, but 'merged_lora' is passed in that position rather than as the adapter_name parameter.

Suggested change:
- model.add_weighted_adapter([f"adapter_{idx}" for idx in range(len(kwargs['adapters']))], "merged_lora")
+ # If no alphas are provided, use equal weights
+ model.add_weighted_adapter(
+     [f"adapter_{idx}" for idx in range(len(kwargs['adapters']))],
+     [1.0] * len(kwargs['adapters']),
+     "merged_lora"
+ )


model_id, trust_remote_code=True, device_map=device.lower(), **model_kwargs
)

if 'adapters' in kwargs and kwargs['adapters'] is not None:
Contributor:

Suggested change:
- if 'adapters' in kwargs and kwargs['adapters'] is not None:
+ if kwargs.get("adapters") is not None:


if alphas is not None:
    assert len(alphas) == len(adapter_names), "`alphas` must be the same length as `adapters`"
    model.add_weighted_adapter([f"adapter_{idx}" for idx in range(len(kwargs['adapters']))], alphas, "merged_lora")
Contributor:

Isn't `[f"adapter_{idx}" for idx in range(len(kwargs['adapters']))]` the same as `adapter_names`?

  logger.info("Using OpenVINO GenAI Continuous Batching API")
  scheduler_config = get_scheduler_config_genai(kwargs["cb_config"])
- pipeline = openvino_genai.LLMPipeline(model_dir, device=device, scheduler_config=scheduler_config, **ov_config)
+ pipeline = openvino_genai.LLMPipeline(model_dir, device=device, adapters=adapter_config, scheduler_config=scheduler_config, **ov_config)
Contributor:

Is the behavior of:

adapter_config = openvino_genai.AdapterConfig()
openvino_genai.LLMPipeline(model_dir, device=device, adapters=adapter_config, **ov_config)

and:

openvino_genai.LLMPipeline(model_dir, device=device, **ov_config)

the same?

likholat (Contributor, Author):

Yes, it is. I'll also add a separate check for the validation team on this.
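
Such a check could be a minimal sketch along these lines (assuming greedy decoding keeps outputs deterministic; model_dir is hypothetical):

import openvino_genai

# Pipeline built with an explicitly empty AdapterConfig.
with_empty = openvino_genai.LLMPipeline("model_dir", device="CPU", adapters=openvino_genai.AdapterConfig())
# Pipeline built without the adapters argument at all.
without = openvino_genai.LLMPipeline("model_dir", device="CPU")

# Greedy decoding is the default, so identical outputs are expected.
out_a = str(with_empty.generate("Hello", max_new_tokens=16))
out_b = str(without.generate("Hello", max_new_tokens=16))
assert out_a == out_b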


adapter_config = openvino_genai.AdapterConfig()
if "adapters" in kwargs and kwargs["adapters"] is not None:
    for adapter, alpha in zip(kwargs['adapters'], kwargs['alphas']):
Contributor:

If the number of alphas does not match the number of adapters, this code will still run and zip the adapters with the alphas, but for HF it will fail. Why? Is it impossible for HF to combine adapters with and without alphas?

likholat (Contributor, Author):

I'd just add an assert here too: if the numbers of adapters and alphas don't match, it's unclear which alpha applies to which adapter; most likely the user forgot to provide a weight.
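
For context, PEFT's PeftModel.add_weighted_adapter(adapters, weights, adapter_name, ...) takes the adapter names first, then the weights, then the merged adapter's name. A sketch of merging with equal weights when no alphas are given (base model and adapter paths are hypothetical):

from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("base_model_id")
# Load each LoRA adapter under a predictable name.
model = PeftModel.from_pretrained(base, "adapter_0_path", adapter_name="adapter_0")
model.load_adapter("adapter_1_path", adapter_name="adapter_1")

adapter_names = ["adapter_0", "adapter_1"]
alphas = None  # e.g. the user did not pass any alphas
weights = alphas if alphas is not None else [1.0] * len(adapter_names)

model.add_weighted_adapter(adapter_names, weights, "merged_lora")
model.set_adapter("merged_lora")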

exit(-1)

adapter_config = openvino_genai.AdapterConfig()
if "adapters" in kwargs and kwargs["adapters"] is not None:
Contributor:

Suggested change:
- if "adapters" in kwargs and kwargs["adapters"] is not None:
+ if kwargs.get("adapters") is not None:


adapter_config = openvino_genai.AdapterConfig()
if "adapters" in kwargs and kwargs["adapters"] is not None:
    assert len(kwargs['alphas']) == len(kwargs["adapters"]), "`alphas` must be the same length as `adapters`"
Contributor:

Can an adapter be set without an alpha for GenAI? If yes, this case needs to be handled as well.

Suggested change:
- assert len(kwargs['alphas']) == len(kwargs["adapters"]), "`alphas` must be the same length as `adapters`"
+ if kwargs.get('alphas') is not None:
+     assert len(kwargs['alphas']) == len(kwargs["adapters"]), "`alphas` must be the same length as `adapters`"
+     for adapter, alpha in zip(kwargs['adapters'], kwargs['alphas']):
+         ....

or, if adapters can't be set without alphas:

Suggested change:
- assert len(kwargs['alphas']) == len(kwargs["adapters"]), "`alphas` must be the same length as `adapters`"
+ assert len(kwargs.get('alphas', [])) == len(kwargs["adapters"]), "`alphas` must be the same length as `adapters`"

likholat (Contributor, Author):

An adapter can be set without an alpha in all cases (GenAI, Optimum, HF).
I don't really understand the purpose of the suggested change; it already works correctly when alpha is not provided.

Contributor:

If alphas is not provided, `len(kwargs['alphas'])` will mean `len(None)` and should raise a TypeError. Does this not happen now?

sbalandi (Contributor, Sep 15, 2025):

You are right, I have found the place: alphas are converted to a list, so it will never be None.

In that case, the check `assert len(kwargs['alphas']) == len(kwargs["adapters"]), "alphas must be the same length as adapters"` can be moved to https://github.com/openvinotoolkit/openvino.genai/blob/master/tools/who_what_benchmark/whowhatbench/wwb.py#L592, and there is no need to keep

model.add_weighted_adapter(adapter_names, "merged_lora")

since this code will never be executed.
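
That behavior is what an argparse definition along these lines would produce (a guess at how wwb.py parses the option, not a quote from it):

import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--adapters", nargs="+", default=None)
# With a list default, the parsed value is a list even when the flag is omitted,
# so len(args.alphas) never ends up as len(None).
parser.add_argument("--alphas", type=float, nargs="+", default=[])

args = parser.parse_args(["--adapters", "a.safetensors"])
assert args.alphas == []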

@Wovchena Wovchena requested a review from Copilot September 15, 2025 09:30
@Copilot Copilot AI left a comment:

Pull Request Overview

Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.



Comment on lines +67 to +69
if kwargs.get("adapters") is not None:
    assert len(kwargs['alphas']) == len(kwargs["adapters"]), "`alphas` must be the same length as `adapters`"
    for adapter, alpha in zip(kwargs['adapters'], kwargs['alphas']):
Copilot AI (Sep 15, 2025):

This assertion will fail when alphas is None but adapters is provided. The code should check if alphas exists before validating lengths, or handle the None case gracefully.

Suggested change:
- if kwargs.get("adapters") is not None:
-     assert len(kwargs['alphas']) == len(kwargs["adapters"]), "`alphas` must be the same length as `adapters`"
-     for adapter, alpha in zip(kwargs['adapters'], kwargs['alphas']):
+ adapters = kwargs.get("adapters")
+ alphas = kwargs.get("alphas")
+ if adapters is not None:
+     if alphas is None:
+         raise ValueError("`alphas` must be provided when `adapters` is specified")
+     assert len(alphas) == len(adapters), "`alphas` must be the same length as `adapters`"
+     for adapter, alpha in zip(adapters, alphas):


    assert len(alphas) == len(adapter_names), "`alphas` must be the same length as `adapters`"
    model.add_weighted_adapter(adapter_names, alphas, "merged_lora")
else:
    model.add_weighted_adapter(adapter_names, "merged_lora")
Copilot AI (Sep 15, 2025):

The add_weighted_adapter method expects weights as the second parameter, but here a string 'merged_lora' is passed. This should likely be model.add_weighted_adapter(adapter_names, [1.0] * len(adapter_names), 'merged_lora') to provide equal weights when alphas are not specified.

Suggested change:
- model.add_weighted_adapter(adapter_names, "merged_lora")
+ model.add_weighted_adapter(adapter_names, [1.0] * len(adapter_names), "merged_lora")

